A comparison between steepest descent and non-linear conjugate gradient algorithms for binding energy minimization of organic molecules

Authors

Abstract

The main intention of optimization is to bring about the "best" outcome in any model by prioritizing needs within a given set of constraints. There is a wide range of problems, and unfortunately many problems formulated from nature are not convex. Solving a non-convex problem is considerably trickier than the conventional method of derivatives. One such problem is computing the minimum value of the binding free energy of various molecules. Energy minimization of a molecule is highly significant in the field of molecular mechanics, which is a foundation of computational biology. For a molecule, binding energy refers to the amount of energy needed to separate an individual particle from a system of particles, or to disperse all particles of the system. Its significance is that it can be used to compute the lowest-energy conformation, which corresponds to the least steric energy. Hence, this paper aims at computing the binding energy of organic molecules under isolated conditions using the steepest descent algorithm and the conjugate gradient algorithm, and at comparing the two.
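As a rough illustration of the two methods being compared (a minimal sketch under assumed settings, not the paper's implementation), the snippet below minimizes the energy of a small Lennard-Jones cluster with a basic steepest descent loop and with SciPy's nonlinear (Polak-Ribiere) conjugate gradient. The four-atom geometry, the potential parameters, and the convergence thresholds are arbitrary choices made for the example.

# Minimal sketch (not the authors' code): minimizing a toy Lennard-Jones
# cluster energy with steepest descent and with nonlinear conjugate gradient.
import numpy as np
from scipy.optimize import minimize

def lj_energy(x, epsilon=1.0, sigma=1.0):
    """Total 12-6 Lennard-Jones energy; x holds flattened 3D coordinates."""
    pos = x.reshape(-1, 3)
    energy = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            r = np.linalg.norm(pos[i] - pos[j])
            energy += 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return energy

def lj_gradient(x, epsilon=1.0, sigma=1.0):
    """Analytic gradient of the pairwise Lennard-Jones energy."""
    pos = x.reshape(-1, 3)
    grad = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            d = pos[i] - pos[j]
            r = np.linalg.norm(d)
            dEdr = 4.0 * epsilon * (-12.0 * sigma**12 / r**13 + 6.0 * sigma**6 / r**7)
            g = dEdr * d / r
            grad[i] += g
            grad[j] -= g
    return grad.ravel()

def steepest_descent(x0, tol=1e-5, max_iter=10_000):
    """Steepest descent with a simple backtracking (Armijo) line search."""
    x = x0.copy()
    for k in range(max_iter):
        g = lj_gradient(x)
        if np.linalg.norm(g) < tol:
            break
        step, e = 1.0, lj_energy(x)
        # Backtrack until the energy decreases sufficiently along -g.
        while lj_energy(x - step * g) > e - 1e-4 * step * (g @ g) and step > 1e-12:
            step *= 0.5
        x -= step * g
    return x, k

# Assumed toy system: four atoms near a tetrahedron of edge ~1.12 (illustrative only).
rng = np.random.default_rng(0)
x0 = np.array([0.0, 0.0, 0.0,  1.12, 0.0, 0.0,
               0.56, 0.97, 0.0,  0.56, 0.32, 0.91]) + rng.normal(scale=0.02, size=12)

x_sd, iters_sd = steepest_descent(x0)
res_cg = minimize(lj_energy, x0, jac=lj_gradient, method="CG")  # Polak-Ribiere nonlinear CG

print(f"steepest descent: E = {lj_energy(x_sd):.6f} after {iters_sd} iterations")
print(f"nonlinear CG:     E = {res_cg.fun:.6f} after {res_cg.nit} iterations")

On smooth potentials of this kind the conjugate gradient run typically reaches a comparable energy in fewer iterations than steepest descent; quantifying that difference for real organic molecules is the kind of comparison the paper carries out.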


Similar articles

Nonsymmetric Preconditioning for Conjugate Gradient and Steepest Descent Methods

We numerically analyze the possibility of turning off postsmoothing (relaxation) in geometric multigrid when used as a preconditioner in conjugate gradient linear and eigenvalue solvers for the 3D Laplacian. The geometric Semicoarsening Multigrid (SMG) method is provided by the hypre parallel software package. We solve linear systems using two variants (standard and flexible) of the preconditio...


Steepest Descent and Conjugate Gradient Methods with Variable Preconditioning

We analyze the conjugate gradient (CG) method with variable preconditioning for solving a linear system with a real symmetric positive definite (SPD) matrix of coefficients A. We assume that the preconditioner is SPD on each step, and that the condition number of the preconditioned system matrix is bounded above by a constant independent of the step number. We show that the CG method with varia...
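To make the notion of variable preconditioning concrete (a hedged sketch under assumed settings, not the construction analyzed in that paper), the snippet below runs a flexible preconditioned CG iteration in which the SPD preconditioner applied at each step is a Jacobi preconditioner with a step-dependent scaling; the test matrix, the scaling rule, and the stopping tolerance are assumptions for illustration.

# Illustrative sketch of CG with a preconditioner that changes from step to step
# (flexible PCG). The test matrix and the step-dependent Jacobi scaling are
# arbitrary assumptions made for the example.
import numpy as np

rng = np.random.default_rng(2)
n = 50
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)          # real SPD coefficient matrix
b = rng.normal(size=n)

def apply_preconditioner(r, k):
    """SPD preconditioner that varies with the iteration index k (rescaled Jacobi)."""
    scale = 1.0 + 0.1 * np.sin(k)    # stays positive, so the operator stays SPD
    return r / (np.diag(A) * scale)

x = np.zeros(n)
r = b - A @ x
z = apply_preconditioner(r, 0)
p = z.copy()
for k in range(1, 200):
    Ap = A @ p
    alpha = (r @ z) / (p @ Ap)
    x = x + alpha * p
    r_new = r - alpha * Ap
    if np.linalg.norm(r_new) < 1e-10 * np.linalg.norm(b):
        break
    z_new = apply_preconditioner(r_new, k)
    # Flexible (Polak-Ribiere style) beta tolerates a varying preconditioner.
    beta = (z_new @ (r_new - r)) / (r @ z)
    p = z_new + beta * p
    r, z = r_new, z_new

print("residual norm:", np.linalg.norm(b - A @ x))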


Steepest descent with momentum for quadratic functions is a version of the conjugate gradient method

It is pointed out that the so called momentum method, much used in the neural network literature as an acceleration of the backpropagation method, is a stationary version of the conjugate gradient method. Connections with the continuous optimization method known as heavy ball with friction are also made. In both cases, adaptive (dynamic) choices of the so called learning rate and momentum param...
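A small numerical illustration of that connection (assumed matrix, right-hand side, and classical heavy-ball parameter choices; not that paper's derivation): the snippet below runs steepest descent with a fixed momentum term and the linear conjugate gradient method on the same SPD quadratic and compares the resulting errors.

# Hedged illustration: heavy-ball / momentum iteration versus linear CG
# on a small SPD quadratic f(x) = 0.5 x^T A x - b^T x.
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(20, 20))
A = M @ M.T + 20 * np.eye(20)        # symmetric positive definite test matrix
b = rng.normal(size=20)

def grad(x):
    return A @ x - b

# Momentum (heavy-ball): x_{k+1} = x_k - alpha*grad(x_k) + beta*(x_k - x_{k-1}),
# with the classical fixed tuning based on the extreme eigenvalues.
lam_min, lam_max = np.linalg.eigvalsh(A)[[0, -1]]
alpha = 4.0 / (np.sqrt(lam_min) + np.sqrt(lam_max)) ** 2
beta = ((np.sqrt(lam_max) - np.sqrt(lam_min)) / (np.sqrt(lam_max) + np.sqrt(lam_min))) ** 2

x_prev = np.zeros(20)
x = np.zeros(20)
for k in range(200):
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x

# Linear conjugate gradient on the same system (adaptive step and "momentum").
x_cg = np.zeros(20)
r = b - A @ x_cg
p = r.copy()
for k in range(20):                   # exact CG terminates in at most n steps
    Ap = A @ p
    step = (r @ r) / (p @ Ap)
    x_cg = x_cg + step * p
    r_new = r - step * Ap
    beta_cg = (r_new @ r_new) / (r @ r)
    p = r_new + beta_cg * p
    r = r_new

x_star = np.linalg.solve(A, b)
print("momentum error:", np.linalg.norm(x - x_star))
print("CG error:      ", np.linalg.norm(x_cg - x_star))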


Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns

The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and the second momentum term in feed-forward neural networks. This analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs which are designed in MATLAB and based on portions (1...


Two New PRP Conjugate Gradient Algorithms for Minimization Optimization Models

Two new PRP conjugate gradient algorithms are proposed in this paper based on two modified PRP conjugate gradient methods: the first algorithm is proposed for solving unconstrained optimization problems, and the second algorithm is proposed for solving nonlinear equations. The first method contains two aspects of information: function value and gradient value. The two methods both possess some good prop...



Journal

Journal title: Journal of Physics

Year: 2023

ISSN: 0022-3700, 1747-3721, 0368-3508, 1747-3713

DOI: https://doi.org/10.1088/1742-6596/2484/1/012004